 AAAI AI-Alert for Jun 20, 2017


Is Artificial Intelligence Finally Coming into Its Own?

#artificialintelligence

In March, Google bought a startup cofounded by Geoffrey Hinton, a University of Toronto computer science professor who was part of the team that won the Merck contest. Extending deep learning into applications beyond speech and image recognition will require more conceptual and software breakthroughs, not to mention many more advances in processing power. Programmers would train a neural network to detect an object or phoneme by blitzing the network with digitized versions of images containing those objects or sound waves containing those phonemes. A team led by Stanford computer science professor Andrew Ng and Google Fellow Jeff Dean showed the system images from 10 million randomly selected YouTube videos.
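The training recipe described above — repeatedly showing a network labeled examples until its weights pick out the target pattern — can be illustrated with a toy stand-in. This is a minimal sketch, not the actual Google or Stanford system: a single-layer network (logistic regression) trained by gradient descent on synthetic data, where the hypothetical "object" is present whenever one feature outweighs another.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=200, d=8):
    """Each 'image' is a small feature vector; label 1 if the pattern is present."""
    X = rng.normal(size=(n, d))
    # Hypothetical rule: the "object" is present when feature 0 outweighs feature 1.
    y = (X[:, 0] > X[:, 1]).astype(float)
    return X, y

def train(X, y, lr=0.5, epochs=500):
    """Single-layer network trained by gradient descent on cross-entropy loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
        grad_w = X.T @ (p - y) / len(y)         # gradient w.r.t. weights
        grad_b = np.mean(p - y)                 # gradient w.r.t. bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

X, y = make_data()
w, b = train(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(preds == y)
```

Real systems like the ones in the article stack many such layers and train on millions of examples, but the loop is the same: predict, measure the error against the label, nudge the weights.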


Homeless, assaulted, broke: drivers left behind as Uber promises change at the top

The Guardian

Yet as staff gathered on Tuesday morning at Uber's headquarters in San Francisco, there was one very conspicuous absence. "Let us address the elephant in the room," said Arianna Huffington, perhaps the most high-profile member of Uber's board. The answer: Travis Kalanick, Uber's 40-year-old co-founder and chief executive, was taking a leave of absence from the taxi-hailing app he has transformed into a global behemoth valued at almost $70bn. Huffington told Uber's staff that the company would not await Kalanick's return, choosing instead to act immediately on the findings of a damning investigation, accepted by the board, into the company's workplace culture amid claims of sexual harassment. "Uber is his life," she said of Kalanick. The embattled company had hit the reset button; without its controversial CEO, it would, Huffington declared, be "a new Uber".


Samsung's Bixby finally gets a voice -- sort of

USATODAY - Tech Top Stories

Now select users will get to test it. One of the most anticipated new features of the Samsung Galaxy S8 and S8+ prior to the phones' launch in March was an artificial intelligence assistant named Bixby that you were supposed to be able to control by voice. Unfortunately, while some of Bixby's capabilities made it onto the phones, the voice-based commands that would make Bixby respond more like the Google Assistant, Apple's Siri, Microsoft's Cortana and Amazon's Alexa were delayed, at least in the U.S. (Bixby is fully operational in South Korea, where Samsung is based). On Friday, Samsung Electronics America announced it will give "select" Galaxy S8 and S8+ users early access to Bixby's vocal capabilities as part of what it still considers an early preview. Samsung hasn't disclosed how many Bixby testers will gain access to this sneak preview, which will let you hold down a Bixby button and start speaking to get the phone to send texts, change settings and make calls.


AI Could Target Autism Before It Even Emerges--But It's No Cure-All

WIRED

Artificial intelligence is ascendant in medicine--from AI eye doctors to chatbot therapists. As medical databases balloon in size and complexity, researchers are teaching computers to sift through and identify patterns, hinting at a future in which machine learning algorithms diagnose disease all on their own. Sometimes, algorithms pick up on early signs of disease that humans wouldn't even know to look for. Last week, researchers at the University of North Carolina and Washington University reported an AI that can identify autistic infants long before they present behavioral symptoms. It's a thrilling opportunity: Early detection gives autism neuroscience a big leg up, as researchers try to understand what goes wrong during development.


Can my computer recognise my cat?

#artificialintelligence

In 2012, Google created an 'artificial neural network' and fed it millions of pictures from the internet. Could your computer now identify your cat more accurately than you?


An Artificial Intelligence Developed Its Own Non-Human Language

#artificialintelligence

A buried line in a new Facebook report about chatbots' conversations with one another offers a remarkable glimpse at the future of language. In the report, researchers at the Facebook Artificial Intelligence Research lab describe using machine learning to train their "dialog agents" to negotiate. At one point, the researchers write, they had to tweak one of their models because otherwise the bot-to-bot conversation "led to divergence from human language as the agents developed their own language for negotiating." They had to use what's called a fixed supervised model instead. In other words, the model that allowed two bots to have a conversation--and use machine learning to constantly iterate strategies for that conversation along the way--led to those bots communicating in their own non-human language.


AI bots are learning to team up by wrangling digital swine in Minecraft

#artificialintelligence

Wrangling a pig--even a virtual one--is much easier if you get a friend to help. This much seems clear from a contest organized by Microsoft researchers to test how artificially intelligent agents could cooperate to solve tricky problems. How best to cooperate with your pig-wrangling pal is another question. The competition addresses an area of artificial intelligence that has had relatively little attention so far. AI researchers often develop software capable of performing a specific human task, such as playing chess or Go, and then measure it according to its ability to defeat a human player.


Stanford Robots Load a Hovering Drone, Solve a Marble Maze, and More

IEEE Spectrum Robotics

Robots that sketch, play ping pong, solve mazes, and attempt to juggle strutted their stuff at the annual demo day for Stanford's Experimental Robotics class. Each year, Professor Oussama Khatib's students aim to teach a selection of industrial robots some new skills. The toughest project: teaching a Kuka robot to juggle. The other projects included two robots that had been taught to draw (a Sawyer and a Puma), a Ping Pong–playing robot (the Kuka again) that scored a few points against its human opponents, a cowboy hat–wearing Sawyer robot that shot at a moving target, a Puma 500 robot that manipulated a maze to send a ball along the correct path, and a drone-loading Sawyer robot that tracked a less-than-stable hovering drone. Check them all out in the video below.


Man vs. Machine: Robot Calls Police After Being Attacked By Drunk Man

International Business Times

A drunk man reportedly ran into an armless K5 robot in the Knightscope parking lot in Mountain View, California, and met his match. The April incident occurred after 41-year-old Jason Sylvain tipped over the 300-pound robot. When the roving security robot found itself off-balance, the K5 called the police and signaled for help. Company spokesman Stacy Dean Stephens said that members of the robot company Knightscope -- which developed the robot that resembles the iconic Star Wars droid R2-D2 -- came out and detained Sylvain until the police arrived.